
# Billion-scale Time Series Forecasting

TimeMoE-200M
License: Apache-2.0
TimeMoE-200M is a member of the TimeMoE family of billion-scale time series foundation models. It is built on a sparse Mixture of Experts (MoE) architecture and is designed for time series forecasting tasks.
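The core idea of a sparse MoE layer, as used in models of this kind, is that a gating network routes each token (here, a time-series patch embedding) to a few experts and mixes their outputs by the gate weights. The sketch below is a minimal illustration with hypothetical dimensions and random weights, not TimeMoE's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical sizes for illustration only.
d_model, n_experts, top_k = 16, 4, 2

# Each "expert" is a small linear map; the gate scores experts per token.
experts = [rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(n_experts)]
gate_w = rng.normal(size=(d_model, n_experts)) * 0.1

def moe_layer(x):
    """Sparse MoE: route each token to its top-k experts, mix by gate weight."""
    scores = softmax(x @ gate_w)                   # (tokens, n_experts)
    top = np.argsort(scores, axis=-1)[:, -top_k:]  # indices of the top-k experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        w = scores[t, top[t]]
        w = w / w.sum()                            # renormalize over selected experts
        for weight, e_idx in zip(w, top[t]):
            out[t] += weight * (x[t] @ experts[e_idx])
    return out

x = rng.normal(size=(8, d_model))  # 8 time-series "tokens"
y = moe_layer(x)
print(y.shape)  # (8, 16)
```

Because only `top_k` of the `n_experts` expert networks run per token, total parameter count can grow far beyond the per-token compute cost, which is how an MoE model scales toward billions of parameters while keeping inference tractable.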
Tags: Climate Model
Author: Maple728
© 2025 AIbase